Conversation

@deependujha (Collaborator) commented Oct 6, 2025

What does this PR do?

Partially addresses #19658

This PR introduces a new callback hook, on_before_optimizer_setup, that provides a safe point between configure_model() and configure_optimizers().

It is primarily designed to fix the incompatibility between BaseFinetuning and LightningModules that define their submodules inside configure_model() (e.g., for FSDP or DeepSpeed).


⚙️ Hook order (new)

setup
→ configure_model
→ on_before_optimizer_setup  ← NEW
→ configure_optimizers
→ on_fit_start

Example

from lightning.pytorch.callbacks import Callback

class MyFinetuningCallback(Callback):
    def on_before_optimizer_setup(self, trainer, pl_module):
        # Freeze the backbone before the optimizers are created,
        # so its parameters are never registered with an optimizer
        for param in pl_module.backbone.parameters():
            param.requires_grad = False
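
Attaching the callback is then straightforward (MyModel is the hypothetical module sketched under "Use case" below):

from lightning.pytorch import Trainer

trainer = Trainer(callbacks=[MyFinetuningCallback()])
trainer.fit(MyModel())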

Use case

BaseFinetuning’s freeze_before_training() must run before the optimizers are created.
Previously there was no hook between configure_model() and configure_optimizers(), so BaseFinetuning was incompatible with modules that instantiate their submodules in configure_model(): at the last hook available to it, the parameters to freeze did not yet exist.
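
For illustration, a minimal sketch of such a module (the names are hypothetical; what matters is that the submodules are created only in configure_model()):

import torch
import torch.nn as nn
from lightning.pytorch import LightningModule


class MyModel(LightningModule):
    def configure_model(self):
        # Submodules are created here (the pattern used with FSDP/DeepSpeed),
        # so they do not yet exist in setup(), where freezing used to happen
        if not hasattr(self, "backbone"):
            self.backbone = nn.Linear(32, 32)
            self.head = nn.Linear(32, 2)

    def configure_optimizers(self):
        # Only parameters that still require grad reach the optimizer,
        # so the backbone must already be frozen by this point
        return torch.optim.Adam(p for p in self.parameters() if p.requires_grad)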

Before submitting
  • Was this discussed/agreed via a GitHub issue? (not for typos and docs)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together?
  • Did you make sure to update the documentation with your changes? (if necessary)
  • Did you write any new necessary tests? (not for typos and docs)
  • Did you verify new and existing tests pass locally with your changes?
  • Did you list all the breaking changes introduced by this pull request?
  • Did you update the CHANGELOG? (not for typos, docs, test updates, or minor internal changes/refactors)

PR review

Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines. In short, see the following bullet-list:

Reviewer checklist
  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified

📚 Documentation preview 📚: https://pytorch-lightning--21272.org.readthedocs.build/en/21272/

@github-actions bot added the pl (Generic label for PyTorch Lightning package) label Oct 6, 2025
@github-actions bot added the docs (Documentation related) label Oct 6, 2025
@deependujha requested a review from Copilot October 6, 2025 17:30
@Copilot (Contributor) left a comment

Pull Request Overview

This PR introduces a new callback hook on_before_optimizer_setup to provide a safe point between configure_model() and configure_optimizers(). The hook is designed to enable callbacks like BaseFinetuning to safely modify model parameters (such as freezing) before optimizers are created, particularly for LightningModules that instantiate submodules within configure_model().

  • Adds the new on_before_optimizer_setup hook to both the Callback and LightningModule hook interfaces (a sketch of the interface follows this list)
  • Updates the trainer execution flow to call this hook only during the fitting stage
  • Provides comprehensive test coverage for the new hook's timing and behavior
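
The hook is a no-op on the base interfaces and is meant to be overridden. A minimal sketch of the Callback-side definition, assuming it mirrors existing hooks such as on_fit_start (the actual code in callback.py may differ):

class Callback:
    def on_before_optimizer_setup(self, trainer, pl_module):
        """Called after ``configure_model()`` but before ``configure_optimizers()``.

        Runs only when fitting, giving callbacks a safe point to mutate
        parameters (e.g. freeze them) before the optimizers capture them.
        """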

Reviewed Changes

Copilot reviewed 5 out of 5 changed files in this pull request and generated 1 comment.

Summary per file:
  • src/lightning/pytorch/trainer/trainer.py — Added the hook invocation in the trainer's _run method during the fitting stage
  • src/lightning/pytorch/core/hooks.py — Defined the on_before_optimizer_setup hook for LightningModule
  • src/lightning/pytorch/callbacks/callback.py — Defined the on_before_optimizer_setup hook for the Callback interface
  • tests/tests_pytorch/callbacks/test_callback_hooks.py — Added a test for the hook's timing and behavior
  • docs/source-pytorch/common/hooks.rst — Updated the documented hook execution order to include the new hook
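
For context, a hypothetical sketch of how such a timing test could look (the actual test in test_callback_hooks.py may differ; BoringModel is Lightning's minimal demo module):

from lightning.pytorch import Trainer
from lightning.pytorch.callbacks import Callback
from lightning.pytorch.demos.boring_classes import BoringModel


class HookOrderTracker(Callback):
    def __init__(self):
        self.calls = []

    def setup(self, trainer, pl_module, stage):
        self.calls.append("setup")

    def on_before_optimizer_setup(self, trainer, pl_module):
        self.calls.append("on_before_optimizer_setup")

    def on_fit_start(self, trainer, pl_module):
        self.calls.append("on_fit_start")


def test_hook_order(tmp_path):
    tracker = HookOrderTracker()
    trainer = Trainer(default_root_dir=tmp_path, fast_dev_run=1, callbacks=[tracker])
    trainer.fit(BoringModel())
    # The new hook must fire after setup() and before on_fit_start()
    assert tracker.calls == ["setup", "on_before_optimizer_setup", "on_fit_start"]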


@GdoongMathew (Contributor) left a comment

Lgtm!
